Search results for "Computer Science - Computational Engineering"

Showing 10 of 13 documents

Nanoscale ear drum: graphene based nanoscale sensors.

2012

The difficulty of determining the mass of a sample increases as its size diminishes. At the nanoscale, there are no direct methods for resolving the mass of single molecules or nanoparticles, so more sophisticated approaches based on electromechanical phenomena are required. Ideally, such nanoelectromechanical techniques should provide information not only about the mass of the target molecules but also about their geometrical properties. To this end, we report a theoretical study that illustrates in detail how graphene membranes can operate as nanoelectromechanical mass-sensor devices. Wide graphene sheets were exposed to different types and amounts of molecul…
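As a rough illustration of the sensing principle (not taken from the paper), resonant mass sensors infer adsorbed mass from the downward shift of the resonance frequency; for small added mass, the standard first-order relation Δf/f₀ ≈ −Δm/(2m₀) applies. A minimal sketch with hypothetical numbers:

```python
def added_mass_from_shift(f0_hz, f_hz, m0_kg):
    """Estimate adsorbed mass from a resonator's frequency shift using the
    small-mass approximation delta_f / f0 ~= -delta_m / (2 * m0).
    Assumes uniform mass loading and an unchanged spring constant."""
    return -2.0 * m0_kg * (f_hz - f0_hz) / f0_hz

# Hypothetical example: a 1 GHz membrane resonator of effective mass
# 1e-18 kg whose resonance drops to 0.999 GHz after adsorption
dm = added_mass_from_shift(1.0e9, 0.999e9, 1.0e-18)
print(f"adsorbed mass ~ {dm:.2e} kg")
```

The relation is the generic resonator result; the paper's molecular-dynamics treatment of graphene membranes also extracts geometric information, which this one-line formula does not capture.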

Chemical Physics (physics.chem-ph); FOS: Computer and information sciences; Condensed Matter - Materials Science; Materials science; Dopant; Graphene; Doping; Detector; Nanoparticle; Materials Science (cond-mat.mtrl-sci); FOS: Physical sciences; Nanotechnology; Computational Engineering, Finance, and Science (cs.CE); Molecular dynamics; Direct methods; Physics - Chemical Physics; General Materials Science; Computer Science - Computational Engineering, Finance, and Science; Nanoscopic scale; Nanoscale

Improving computation efficiency using input and architecture features for a virtual screening application

2023

Virtual screening is an early stage of the drug discovery process that selects the most promising candidates. In the urgent computing scenario, it is critical to find a solution in a short time frame. In this paper, we focus on a real-world virtual screening application to evaluate out-of-kernel optimizations that consider input and architecture features to improve computation efficiency on GPUs. Experimental results on a modern supercomputer node show that we can almost double the performance. Moreover, we implemented the optimization in SYCL, where it provides a benefit consistent with the CUDA version. A virtual screening campaign can use this gain in performance to increase the nu…

Computational Engineering, Finance, and Science (cs.CE); FOS: Computer and information sciences; Computer Science - Distributed, Parallel, and Cluster Computing; Hardware Architecture (cs.AR); Distributed, Parallel, and Cluster Computing (cs.DC); Computer Science - Computational Engineering, Finance, and Science; Computer Science - Hardware Architecture

Seam Puckering Objective Evaluation Method for Sewing Process

2015

The paper presents an automated method for assessing and classifying puckering defects detected during pre-production control of the sewing machine or during product inspection. We discuss the possible causes of, and remedies for, wrinkle nonconformities. Subjective factors related to the control environment and the operators during seam evaluation can be reduced by an automated system based on image processing. Our implementation involves spectral image analysis using the Fourier transform and an unsupervised neural network, the Kohonen map, employed to classify material specimens (the input images) into five discrete degrees of quality…
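A minimal sketch of the pipeline the abstract describes: spectral features via a discrete Fourier transform, clustered by a small Kohonen (self-organizing) map. The 1-D signals, band scheme, and map size here are illustrative assumptions, not the paper's actual configuration:

```python
import cmath, math, random

def dft_energy(signal, bands=4):
    """Naive DFT magnitudes summed over equal-width frequency bands: a
    crude stand-in for the spectral features extracted from seam images."""
    n = len(signal)
    spectrum = [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                        for t in range(n))) for k in range(n // 2)]
    w = len(spectrum) // bands
    return [sum(spectrum[i * w:(i + 1) * w]) for i in range(bands)]

class Som1D:
    """Minimal 1-D Kohonen map: `units` prototype vectors on a line."""
    def __init__(self, units, dim, seed=0):
        r = random.Random(seed)
        self.w = [[r.random() for _ in range(dim)] for _ in range(units)]

    def bmu(self, x):
        """Index of the best-matching unit (closest prototype)."""
        return min(range(len(self.w)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(self.w[i], x)))

    def train(self, data, epochs=50, lr=0.3, radius=1):
        for _ in range(epochs):
            for x in data:
                b = self.bmu(x)
                for i, wi in enumerate(self.w):
                    if abs(i - b) <= radius:      # neighbourhood update
                        for d in range(len(wi)):
                            wi[d] += lr * (x[d] - wi[d])

# Two synthetic 1-D 'seam profiles': flat vs periodically wrinkled
flat = [0.0] * 32
wavy = [math.sin(2 * math.pi * 4 * t / 32) for t in range(32)]
som = Som1D(units=5, dim=4)
som.train([dft_energy(flat), dft_energy(wavy)])
```

In the paper's setting the map clusters specimens of real fabric images into five quality grades; here the two toy profiles simply land on different map units.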

Computational Engineering, Finance, and Science (cs.CE); FOS: Computer and information sciences; Computer Vision and Pattern Recognition (cs.CV); Computer Science - Computer Vision and Pattern Recognition; Computing Methodologies - Image Processing and Computer Vision; Computer Science - Computational Engineering, Finance, and Science

Reduced Order Models for Pricing European and American Options under Stochastic Volatility and Jump-Diffusion Models

2016

European options can be priced by solving parabolic partial(-integro) differential equations under stochastic volatility and jump-diffusion models such as the Heston, Merton, and Bates models. American option prices can be obtained by solving linear complementarity problems (LCPs) with the same operators. A finite difference discretization leads to a so-called full order model (FOM). Reduced order models (ROMs) are derived by employing proper orthogonal decomposition (POD). The early exercise constraint of American options is enforced by a penalty on a subset of grid points. The presented numerical experiments demonstrate that pricing with ROMs can be orders of magnitude faster within a given model p…
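The snapshot-based POD step can be sketched as follows; the toy snapshot matrix and the energy threshold are assumptions for illustration, not the paper's setup:

```python
import numpy as np

def pod_basis(snapshots, energy=0.9999):
    """POD of a (n_dof, n_snapshots) matrix: return the leading left
    singular vectors capturing `energy` of the snapshot variance."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1
    return U[:, :r]

# Toy 'full order model' field: 200 grid points, 25 parameter snapshots,
# built from exactly two spatial modes (so the ROM basis should be tiny)
x = np.linspace(0.0, 1.0, 200)
snaps = np.column_stack([np.exp(-t) * np.sin(np.pi * x)
                         + 0.1 * np.exp(-4.0 * t) * np.sin(3.0 * np.pi * x)
                         for t in np.linspace(0.0, 1.0, 25)])
V = pod_basis(snaps)
# A Galerkin ROM then replaces a FOM operator A by the small matrix V.T @ A @ V
```

This captures only the basis-construction idea; the paper's contribution lies in applying the reduced operators to the PDE and LCP pricing problems themselves.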

Computational Engineering, Finance, and Science (cs.CE); FOS: Computer and information sciences; FOS: Economics and business; Quantitative Finance - Computational Finance; European option; linear complementarity problem; Computational Finance (q-fin.CP); reduced order model; American option; Computer Science - Computational Engineering, Finance, and Science; option pricing

Modeling Business

2003

Business concepts are studied using a metamodel-based approach in UML 2.0. The Notation-Independent Business concepts metamodel is introduced. The approach offers a mapping between different business modeling notations, which could be used to bridge business modeling (BM) tools and to boost the MDA approach.

Computational Engineering, Finance, and Science (cs.CE); FOS: Computer and information sciences; Software - Software Engineering; Computer Science - Computational Engineering, Finance, and Science; ACM classes: I.6.5; J.1

Efficient formulation of a two-noded geometrically exact curved beam element

2021

The article extends the formulation of a 2D geometrically exact beam element proposed by Jirasek et al. (2021) to curved elastic beams. The formulation is based on equilibrium equations in their integrated form, combined with the kinematic relations and sectional equations that link the internal forces to sectional deformation variables. The resulting first-order differential equations are approximated by a finite difference scheme, and the boundary value problem is converted into an initial value problem using the shooting method. The article develops the theoretical framework based on the Navier-Bernoulli hypothesis, with a possible extension to shear-flexible beams. Numerical procedures …

Computational Engineering, Finance, and Science (cs.CE); FOS: Computer and information sciences; Numerical Analysis; curved beam; geometrically exact nonlinear beam; Kirchhoff beam; large rotations; planar frame; shooting method; Applied Mathematics; General Engineering; Computer Science - Computational Engineering, Finance, and Science; Settore ICAR/08 - Scienza delle Costruzioni

Group Importance Sampling for particle filtering and MCMC

2018

Bayesian methods and their implementations by means of sophisticated Monte Carlo techniques have become very popular in signal processing in recent years. Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discus…
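A hedged sketch of the idea: ordinary importance sampling produces a population of weighted samples, and a group can then be summarized by a single particle. The compression rule below (total weight plus one resampled location) is one simple choice consistent with the description; the paper's actual GIS rules may differ:

```python
import math, random

def importance_sample(log_target, draw, logq, n, rng):
    """Plain importance sampling: proposal samples plus weights
    target(x)/q(x), with the target known only up to a constant."""
    xs = [draw(rng) for _ in range(n)]
    ws = [math.exp(log_target(x) - logq(x)) for x in xs]
    return xs, ws

def compress_group(xs, ws, rng):
    """Summarise a group of weighted samples by ONE weighted particle:
    weight = total group weight, location = a single multinomial
    resampling draw from the group (a simple compression rule)."""
    total = sum(ws)
    u, acc = rng.random() * total, 0.0
    for x, w in zip(xs, ws):
        acc += w
        if acc >= u:
            return x, total
    return xs[-1], total

rng = random.Random(1)
log_target = lambda x: -0.5 * x * x                  # unnormalised N(0, 1)
draw = lambda r: r.gauss(0.0, 2.0)                   # proposal N(0, 2^2)
logq = lambda x: -0.125 * x * x - math.log(2.0 * math.sqrt(2.0 * math.pi))

xs, ws = importance_sample(log_target, draw, logq, 5000, rng)
est = sum(w * x * x for x, w in zip(xs, ws)) / sum(ws)   # E[x^2], about 1
x_c, w_c = compress_group(xs, ws, rng)
```

Carrying the summed weight along with the resampled particle is what keeps downstream estimators (e.g. in particle filters) properly weighted after compression.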

FOS: Computer and information sciences; Computer Science - Machine Learning; Computer science; Posterior probability; Monte Carlo method; Machine Learning (stat.ML); Multiple-try Metropolis; Statistics - Computation; Machine Learning (cs.LG); Computational Engineering, Finance, and Science (cs.CE); Methodology (stat.ME); Computer Science [cs] / Signal and Image Processing; Statistics - Machine Learning; Artificial Intelligence; Resampling; Electrical and Electronic Engineering; Computer Science - Computational Engineering, Finance, and Science; Statistics - Methodology; Computation (stat.CO); Markov chain; Applied Mathematics; Markov chain Monte Carlo; Computational Theory and Mathematics; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Particle filter; Engineering Sciences [physics] / Signal and Image Processing; Algorithm; Importance sampling; Digital Signal Processing

Semantics of UML 2.0 Activity Diagram for Business Modeling by Means of Virtual Machine

2005

The paper proposes a more formalized definition of UML 2.0 Activity Diagram semantics. A subset of activity diagram constructs relevant to business process modeling is considered. The semantics definition is based on the original token-flow methodology, but a more constructive approach is used. The Activity Diagram Virtual Machine is defined by means of a metamodel, with operations defined by a mix of pseudocode and OCL pre- and postconditions. A formal procedure is described which builds the virtual machine for any activity diagram. The relatively complicated original token-movement rules in control nodes and edges are combined into paths from action to action. A new approach is the us…
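A toy token-flow interpreter in the spirit of the proposed virtual machine; the graph representation and firing rule here (a node fires once a token has arrived on every incoming edge) are illustrative simplifications, not the paper's metamodel:

```python
from collections import defaultdict, deque

class ActivityVM:
    """Toy token-flow interpreter for a drastically simplified activity
    graph: a node fires when a token has arrived on each incoming edge,
    which makes a join wait for all of its inputs."""

    def __init__(self):
        self.succ = defaultdict(list)
        self.need = defaultdict(int)          # incoming-edge count per node

    def edge(self, src, dst):
        self.succ[src].append(dst)
        self.need[dst] += 1

    def run(self, start):
        tokens, trace = defaultdict(int), []
        queue = deque([start])
        while queue:
            node = queue.popleft()
            trace.append(node)                # 'execute' the node
            for nxt in self.succ[node]:
                tokens[nxt] += 1
                if tokens[nxt] == self.need[nxt]:   # all inputs arrived
                    tokens[nxt] = 0
                    queue.append(nxt)
        return trace

vm = ActivityVM()
for a, b in [("start", "fork"), ("fork", "A"), ("fork", "B"),
             ("A", "join"), ("B", "join"), ("join", "end")]:
    vm.edge(a, b)
print(vm.run("start"))
```

The fork/join pair above shows the behaviour the abstract alludes to: both branches execute before the join releases its token toward the final node.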

FOS: Computer and information sciences; Computer Science - Programming Languages; Semantics (computer science); Computer science; Programming language; Activity diagram; Business process modeling; Security token; Metamodeling; Computational Engineering, Finance, and Science (cs.CE); Unified Modeling Language; Virtual machine; Computer Science - Computational Engineering, Finance, and Science; Pseudocode; Programming Languages (cs.PL)

A Two-Stage Reconstruction of Microstructures with Arbitrarily Shaped Inclusions

2020

The main goal of our research is to develop an effective method, with a wide range of applications, for the statistical reconstruction of heterogeneous microstructures with compact inclusions of any shape, such as highly irregular grains. The devised approach uses multi-scale extended entropic descriptors (ED) that quantify the degree of spatial non-uniformity of configurations of finite-sized objects. This technique builds on previously developed entropy methods for statistical reconstruction. Here, we discuss the two-dimensional case, but the method can be generalized to three dimensions. At the first stage, the developed procedure creates a set of black synthetic …
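The reconstruction loop can be sketched as a pixel-swap simulated annealing driven by a simple two-point descriptor standing in for the paper's entropic descriptors; the grid size, schedule, and target value are illustrative assumptions:

```python
import math, random

def s2_row(grid, r=1):
    """Two-point probability along rows at separation r: fraction of pixel
    pairs (x, x+r) that are both phase 1 (a crude stand-in for the paper's
    multi-scale entropic descriptors)."""
    rows, cols = len(grid), len(grid[0])
    hits = sum(grid[y][x] * grid[y][(x + r) % cols]
               for y in range(rows) for x in range(cols))
    return hits / (rows * cols)

def anneal(grid, target, steps=4000, t0=0.05, seed=0):
    """Pixel-swap simulated annealing: drive the descriptor toward `target`
    while exactly conserving the phase fractions."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    cost = abs(s2_row(grid) - target)
    for k in range(steps):
        temp = max(t0 * (1.0 - k / steps), 1e-9)
        y1, x1 = rng.randrange(rows), rng.randrange(cols)
        y2, x2 = rng.randrange(rows), rng.randrange(cols)
        if grid[y1][x1] == grid[y2][x2]:
            continue                          # swap must move a pixel
        grid[y1][x1], grid[y2][x2] = grid[y2][x2], grid[y1][x1]
        new = abs(s2_row(grid) - target)
        if new <= cost or rng.random() < math.exp((cost - new) / temp):
            cost = new                        # accept (possibly uphill)
        else:
            grid[y1][x1], grid[y2][x2] = grid[y2][x2], grid[y1][x1]  # revert
    return cost

rng = random.Random(2)
grid = [[1 if rng.random() < 0.5 else 0 for _ in range(16)] for _ in range(16)]
ones = sum(map(sum, grid))
final = anneal(grid, target=0.4)   # push horizontal clustering upward
```

The paper's two-stage procedure additionally handles compact clusters of arbitrary shape; this sketch only shows the descriptor-matching annealing core.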

FOS: Computer and information sciences; Computer science; Article; Computational Engineering, Finance, and Science (cs.CE); Cluster (physics); Effective method; General Materials Science; Computer Science - Computational Engineering, Finance, and Science; Pixel; multi-scale entropic descriptors; random heterogeneous materials; Microstructure; Standard technique; Cement paste; two-stage reconstruction; simulated annealing for clusters; Simulated annealing; Algorithm; Materials

Compressed Particle Methods for Expensive Models With Application in Astronomy and Remote Sensing

2021

In many inference problems, the evaluation of complex and costly models is required. In this context, Bayesian methods have become very popular in several fields in recent years for parameter inversion, model selection, and uncertainty quantification. Bayesian inference requires the approximation of complicated integrals involving (often costly) posterior distributions. Generally, this approximation is obtained by means of Monte Carlo (MC) methods. To reduce the computational cost of the corresponding techniques, surrogate models (also called emulators) are often employed. Another alternative is the so-called Approximate Bayesian Computation (ABC) sc…
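Of the techniques mentioned, rejection ABC is the easiest to sketch: draw parameters from the prior, run the (possibly expensive) forward model, and keep draws whose simulated data land close to the observation. The toy Gaussian model below is an assumption for illustration, not the paper's astronomy or remote-sensing application:

```python
import random, statistics

def abc_rejection(prior_draw, simulate, observed, eps, n, rng):
    """Rejection ABC: keep prior draws whose simulated data fall within
    eps of the observation -- no likelihood evaluation needed, only runs
    of the forward model."""
    kept = []
    for _ in range(n):
        theta = prior_draw(rng)
        if abs(simulate(theta, rng) - observed) < eps:
            kept.append(theta)
    return kept

rng = random.Random(3)
# Hypothetical forward model: the mean of 20 noisy Gaussian measurements
simulate = lambda mu, r: statistics.fmean(r.gauss(mu, 1.0) for _ in range(20))
post = abc_rejection(lambda r: r.uniform(-5.0, 5.0), simulate,
                     observed=1.0, eps=0.3, n=3000, rng=rng)
# `post` approximates the posterior over mu; its mean should sit near 1
```

When each forward run is expensive, this is exactly where the surrogate models and compressed particle sets discussed in the abstract pay off: fewer exact model evaluations per accepted sample.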

FOS: Computer and information sciences; Computer science; Astronomy; Model selection; Bayesian inference; Monte Carlo method; Bayesian probability; Aerospace Engineering; Inference; Machine Learning (stat.ML); Statistics - Computation; Computational Engineering, Finance, and Science (cs.CE); remote sensing; importance sampling; Statistics - Machine Learning; numerical inversion; particle filtering; Electrical and Electronic Engineering; Uncertainty quantification; Approximate Bayesian computation; Computer Science - Computational Engineering, Finance, and Science; Computation (stat.CO); IEEE Transactions on Aerospace and Electronic Systems